Container usage estimate
Patent abstract:
Embodiments of the invention are primarily aimed at estimating the capacity usage of a container. In one embodiment, the invention relates to a method for estimating the fullness of a container. The method comprises: mounting an image capture device near a container loading area, wherein the image capture device is operable to capture three-dimensional images; capturing, via the image capture device, a three-dimensional image representative of a three-dimensional formation, the three-dimensional image having a plurality of points with three-dimensional point data including depth data; generating a histogram of the depth data from the three-dimensional image; and estimating the fullness of the container based at least in part on the histogram.
Publication number: BE1025929B1
Application number: E20185902
Filing date: 2018-12-18
Publication date: 2020-01-06
Inventors: Justin F Barish; Adithya H Krishnamurthy
Applicant: Symbol Technologies Llc
Primary IPC class:
Patent description:
BACKGROUND
Goods can be transported in many different ways and by many different methods. Long-distance transport in particular uses containers that can be loaded with goods and then moved by vehicles, trains, vessels, or aircraft to their desired destinations. Although detachable containers are not always used, short-distance goods transport similarly uses delivery trucks / lorries to which containers are attached for the storage of items and cargo. In the past, most loading and unloading of goods was carried out without any significant contribution from computerized systems. However, with the development of computing capabilities, the availability of measured environmental data, and the ever-increasing focus on efficiency, loading and unloading procedures are nowadays monitored, guided and / or assisted by computing platforms that can act on information immediately. An aspect of particular importance is loading efficiency and being able to estimate how much of the container space is being wasted. Leaving large gaps between freight items, for example, could mean that the container is not filled to capacity. Similarly, failing to fill the container up to the top when goods are loaded from the back to the front also leaves valuable space unoccupied, thereby creating inefficiency. A specific problem can occur, however, when working with boxes and / or cargo of certain dimensions and / or types, such as, for example, gaylord-type boxes. Gaylord boxes should generally be understood as large or bulk-sized boxes that are designed to hold one or several (often irregularly shaped) loads. This makes it possible to transport products in a large, single box or on a pallet. Although gaylord boxes can vary in size, they are generally relatively large, thereby preventing such boxes from being effectively stacked close to the ceiling of a container.
For example, if a gaylord box is 4 feet tall and less than 4 feet of clearance is left up to the ceiling of a container, no additional gaylords can be stacked on top and the space above the gaylord may be wasted. In other situations, gaylord boxes can remain open, preventing the stacking of other boxes / cargo on top of them. Consequently, a large space above the gaylord can remain open and could affect automated container-loading analysis. Similar issues could arise with other large or irregularly shaped freight / goods. Accordingly, there exists a need for improved means for detecting and reporting container space usage.
SUMMARY OF THE INVENTION
According to the invention, a method is provided for estimating the fullness of a container, the method comprising: mounting an image capture device near a container loading area, the image capture device being operable to capture three-dimensional images; capturing, via the image capture device, a three-dimensional image representative of a three-dimensional formation, the three-dimensional image having a plurality of points with three-dimensional point data including depth data; generating a histogram of the depth data from the three-dimensional image; and estimating the fullness of the container based at least in part on the histogram.
The container can, for example, be delimited by a floor, a ceiling, a first upright wall, a second upright wall and a third upright wall, the first upright wall being opposite and parallel to the second upright wall, and the third upright wall being transverse to the first upright wall and the second upright wall. The container can preferably be further delimited by a door that is opposite and parallel to the third upright wall. The three-dimensional image can preferably be captured over a field of view (FOV) that does not extend outside the interior of the container.
The act of estimating the fullness of the container may include, for example, calculating an average depth value from the histogram and comparing the average value with a plurality of depth values each associated with a fullness level of the container. Each fullness level of the container can depend on at least one of a height, a width and a depth of the container.
According to a further aspect of the invention, a system is provided for analyzing the capacity of a container, comprising a container monitoring unit (CMU) mounted near a loading compartment, the CMU comprising a housing, an imaging assembly at least partially within the housing and operable to capture a three-dimensional image representative of a three-dimensional formation, the three-dimensional image having a plurality of points with three-dimensional point data including depth data, and a CMU controller communicatively connected to the imaging assembly, the controller operable to transmit the three-dimensional point data; and a host computer communicatively connected to the CMU, the host computer comprising a host computer controller operable to receive the three-dimensional point data from the CMU controller, generate a histogram of the depth data from at least a portion of the three-dimensional point data, and estimate a first fullness of the container based at least in part on the histogram.
The container can, for instance, be delimited by a floor, a ceiling, a first upright wall, a second upright wall and a third upright wall, the first upright wall being opposite and parallel to the second upright wall, and the third upright wall being transverse to the first upright wall and the second upright wall. The container can preferably be further delimited by a door that is opposite and parallel to the third upright wall. The three-dimensional image can preferably be captured over a field of view (FOV) that does not extend outside the interior of the container.
The host computer controller may preferably be operable to estimate the first fullness of the container by calculating an average depth value from the histogram and comparing the average value with a plurality of depth values each associated with a first fullness level of the container. Each first fullness level can, for example, depend on at least one of a height, a width and a depth of the container. The host computer controller may further be operable to: estimate the first fullness of the container by calculating an average depth value from the histogram and basing the first fullness at least partly on the average depth value; estimate a second fullness of the container by identifying a first plurality of histogram peaks that fall within a threshold percentage of a highest histogram peak, identifying the peak of the plurality of histogram peaks that has the farthest distance value from the CMU, and basing the second fullness of the container at least partly on the farthest distance value; provide an indication of mixed load when, during loading of the container, the first fullness of the container and the second fullness of the container diverge and do not re-converge along at least a portion of the container; and provide an indication of no mixed load when, during loading of the container, the first fullness of the container and the second fullness of the container diverge and re-converge along at least the portion of the container. Basing the first fullness at least in part on the average depth value may, for example, include comparing the average depth value with a first plurality of depth values each associated with a first fullness level of the container. Basing the usage of the container at least in part on the farthest distance value may, for example, include comparing the farthest distance value with a second plurality of distance values each associated with a usage level of the container.
Basing the usage of the container at least in part on the farthest distance value may include converting the farthest distance value into a percentage of a depth of the container and subtracting that percentage from 100 %.
According to a further aspect of the invention, a container monitoring unit (CMU) is provided for analyzing the capacity of a container, comprising a housing, an imaging assembly at least partially within the housing and operable to capture a three-dimensional image representative of a three-dimensional formation, wherein the three-dimensional image has a plurality of points with three-dimensional point data including depth data, and a CMU controller communicatively connected to the imaging assembly, the controller operable to generate a histogram of the depth data from at least a portion of the three-dimensional point data, and to estimate a fullness of the container, based at least in part on the histogram, via a histogram-average-based calculation.
The histogram-average-based calculation may preferably include calculating an average depth value from the histogram and comparing the average value with a plurality of depth values each associated with a fullness level of the container. The controller may preferably further be operable to: estimate the fullness of the container, based at least in part on the histogram, via a histogram-peak-based calculation; provide an indication of mixed load when, during loading of the container, the fullness of the container estimated via the histogram-average-based calculation and the fullness of the container estimated via the histogram-peak-based calculation diverge and do not re-converge along at least a portion of the container; and provide an indication of no mixed load when, during loading of the container, the fullness of the container estimated via the histogram-average-based calculation and the fullness of the container estimated via the histogram-peak-based calculation diverge and re-converge along at least the portion of the container.
The histogram-average-based calculation may include, for example, calculating an average depth value from the histogram and basing the fullness of the container estimated via the histogram-average-based calculation at least in part on that average depth value, and the histogram-peak-based calculation may include, for example, identifying a first plurality of histogram peaks that fall within a threshold percentage of a highest histogram peak, identifying the peak of the first plurality of histogram peaks that has the farthest distance value from the CMU, and basing the fullness of the container estimated via the histogram-peak-based calculation at least in part on the farthest distance value.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE FIGURES
The accompanying figures, where like reference numerals refer to identical or functionally similar elements in the individual views, together with the figure description below, are incorporated into and form part of the description, and serve to further illustrate embodiments of concepts comprising the claimed invention and to explain various principles and advantages of these embodiments. In the figures:
FIG. 1 shows a loading facility in accordance with an embodiment of the invention;
FIG. 2 shows an inside of the loading facility of FIG. 1;
FIG. 3 shows a container monitoring unit in accordance with an embodiment of the present invention;
FIG. 4A is a top view of the loading facility of FIG. 1 showing an example field of view of a container monitoring unit;
FIG. 4B is a side view of the loading facility of FIG. 1 showing an example field of view of a container monitoring unit;
FIG. 5 is a schematic exemplary block diagram of a communication network implemented in the facility of FIG. 1;
FIG. 6 is a flowchart representative of a method for detecting the filling of a container in accordance with an embodiment of the present invention;
FIGs. 7A and 7B show examples of a partially loaded container;
FIG. 8 is a depth-value histogram of a portion of FIG. 7B;
FIG. 9 is a graph representative of a container fullness calculated via a histogram-average-based method and via a histogram-peak-based method during various stages of loading a container in accordance with an embodiment of the present invention; and
FIG. 10 is a graph representative of a container fullness calculated via a histogram-average-based method and via a histogram-peak-based method during various stages of loading a container in accordance with an embodiment of the present invention.
Elements in the figures are shown for simplicity and clarity and are not necessarily drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve the understanding of embodiments of the present invention. The device and method components are represented, where suitable, by conventional symbols in the figures, showing only those specific details that are relevant to the understanding of the embodiments of the present invention, so as not to obscure the description with details that will be readily apparent in themselves.
DETAILED DESCRIPTION OF THE INVENTION
In this document, the term "container" will refer to any container that is transportable by a vehicle, a train, a vessel and / or an aircraft, and that is designed to store transportable goods, such as packed and / or unpacked items and / or other types of cargo. Accordingly, an example of a container includes an enclosed container that is non-removably attached to a wheeled platform with a hook for towing by a powered vehicle. An example of a container also includes an enclosed container that is removably attached to a platform with wheels and a towbar for towing by a powered vehicle.
An example of a container also includes a casing that is non-removably attached to a frame of a powered vehicle, as may be the case with a delivery truck, rigid truck, etc. Although the exemplary embodiments described below may refer to one type of container, the scope of the invention extends to other types of containers, as defined above.
In one embodiment, the present invention relates to a method for estimating the fullness of a container. The method comprises: mounting an image capture device near a container loading area, wherein the image capture device is operable to capture three-dimensional images; capturing, via the image capture device, a three-dimensional image representative of a three-dimensional formation, the three-dimensional image having a plurality of points with three-dimensional point data including depth data; generating a histogram of the depth data from the three-dimensional image; and estimating the fullness of the container based at least in part on the histogram.
In another embodiment, the present invention relates to a system for analyzing a container's loading. The system comprises a container monitoring unit (CMU) mounted near a loading compartment, the CMU comprising: a housing; an imaging assembly at least partially within the housing and operable to capture a three-dimensional image representative of a three-dimensional formation, the three-dimensional image having a plurality of points with three-dimensional point data including depth data; and a CMU controller that is communicatively connected to the imaging assembly, the controller being operable to transmit the three-dimensional point data.
The system also includes a host computer communicatively connected to the CMU, the host computer including a host computer controller that is operable to: receive the three-dimensional point data from the CMU controller; generate a histogram of the depth data from the three-dimensional point data; and estimate a first fullness of the container based at least in part on the histogram.
In yet another embodiment, the present invention relates to a CMU for analyzing the capacity of a container. The CMU comprises: a housing; an imaging assembly at least partially within the housing and operable to capture a three-dimensional image representative of a three-dimensional formation, the three-dimensional image having a plurality of points with three-dimensional point data including depth data; and a CMU controller that is communicatively connected to the imaging assembly. The controller is operable to: generate a histogram of the depth data from at least a portion of the three-dimensional point data; and estimate the fullness of the container, based at least in part on the histogram, via a histogram-average-based calculation.
With reference to the drawing, FIG. 1 shows an exemplary environment in which embodiments of the present invention can be implemented. In the present example, the environment is provided in the form of a loading bay 100 (also referred to as a loading facility) where containers 102 are loaded with various goods and / or where various goods are unloaded from the containers 102. The loading bay 100 comprises a facility 104 having a plurality of loading compartments 106.1-106.n facing a loading facility lot 108 where vehicles such as trailers (not shown) that deliver and pick up containers 102 can be parked. To be loaded, each container 102 is backed up to the facility 104 such that it is generally transverse to the wall comprising the loading compartments 106 and in line with one of the loading compartments (in this case 106.3).
As illustrated, each loading compartment 106 includes a compartment door 110 that can be lowered to close the loading compartment 106 or lifted to open it, thereby allowing the inside of the facility 104 to be accessed therethrough. In addition, each loading compartment 106 is provided with a container monitoring unit (CMU) 112. The CMU is mounted near the container loading area, preferably in the upper portion of the loading compartment 106 outside the door 110, facing the loading facility lot 108 or the inside of a container 102 if one is docked at the loading compartment. To protect the CMU from inclement weather, it can be mounted under a protective cover 114. Once a container is docked, goods can be loaded into / unloaded from the container 102 with the CMU 112 keeping a view of the rear / inside of the container.
FIG. 2 is an exemplary perspective view of the loading facility 104 of FIG. 1 as viewed from the inside, showing a container 102 that is docked at loading compartment 106.3 with an open container door and a container 116 that is docked at loading compartment 106.2 with a closed container door 118. The CMU 112 is used to help determine the status of the container door, as further described below. In the embodiment described here and as shown in FIG. 3, the CMU 112 is a mountable device that includes a 3D depth camera 120 for capturing 3D (three-dimensional) images (e.g., 3D image data comprising a plurality of points with three-dimensional point data) and a 2D camera 122 for capturing 2D images (for example, 2D image data). The 2D camera can be an RGB (red, green, blue) camera for capturing 2D images. The CMU 112 can also include one or more processors and one or more computer memories for storing image data and / or for executing applications / instructions that perform analysis or other functions as described herein.
The CMU 112 may comprise, for example, flash memory for determining, storing or otherwise processing the image data and / or post-scan data. In addition, the CMU 112 may further comprise a network interface to enable communication with other devices (such as server 130). The network interface of the CMU 112 may include any suitable type of communication interface(s) (e.g., wired and / or wireless interfaces) arranged to operate in accordance with a suitable protocol. In various embodiments, and as shown in FIGs. 1 and 2, the CMU 112 is mounted via a mounting bracket 124 and oriented toward the docked containers to capture 3D and / or 2D image data from the inside and outside thereof. In one embodiment, for capturing 3D image data, the 3D depth camera 120 comprises an infrared (IR) projector and a related IR camera. The IR projector projects a pattern of IR light or beams onto an object or surface, which may include surfaces of the container 102 (such as the door, walls, floor, etc.), objects inside the container (such as boxes, packages, temporary shipping materials, etc.), and / or surfaces of the loading facility lot 108 (such as the surface of the lot on which the containers are parked). The IR rays or beams can be distributed over the object or surface in a pattern of dots by the IR projector, which can be measured or scanned by the IR camera. A depth-detection application, such as one executed on one or more processors or memories of the CMU 112, can determine various depth values based on the pattern of dots, for example depth values of the inside of the container 102. A nearby object (for example, nearby boxes, packages, etc.) can be determined where the dots are close to each other, and distant objects (for example, far-away boxes, packages, etc.) can be determined where the dots are more spread out.
The various depth values can be used by the depth-detection application and / or the CMU 112 to generate a depth map. The depth map can represent a 3D image of, or contain 3D image data for, the objects or surfaces measured or scanned by the 3D depth camera 120. Additionally, in one embodiment, to capture 2D image data, the 2D camera 122 includes an RGB (red, green, blue) based camera for capturing 2D images having RGB-based pixel data. In some embodiments, the 2D camera 122 captures 2D images and related 2D image data at the same or a similar point in time as the 3D depth camera 120, such that the CMU 112 has both a set of 3D image data and a set of 2D image data available for a specific surface, object or scene at the same or a comparable time. Referring to FIGs. 4A and 4B, the CMU can be oriented such that the fields of view (FOV) 126 of the 3D camera and the 2D camera cover a majority of the inside of the container 102. Additionally, both FOVs can overlap considerably to capture data across essentially the same area. Accordingly, the CMU 112 can scan, measure, or otherwise capture image data from the walls, floor, ceiling, packages, or other objects or surfaces within the container to determine the 3D and 2D image data. Similarly, when a container is absent from the loading bay, the CMU can scan, measure or otherwise capture image data from the surface of the loading facility lot 108 to determine the 3D and 2D image data. The image data can be processed by the one or more processors and / or memories of the CMU 112 (or, in some embodiments, one or more remote processors and / or memories of a server) to implement analysis functions, such as graphical or image analysis, as described by the one or more various flowcharts, block diagrams, methods, functions or various embodiments herein.
In some embodiments, the CMU 112 processes the 3D and 2D image data for use by other devices (e.g., client device 128, which may be in the form of a mobile device such as a tablet, smartphone, laptop, or other such mobile computer system, or server 130, which may be in the form of a single computer or multiple computers that operate to control access to a centralized resource or service in a network). Processing the image data may generate post-scan data that includes metadata, simplified data, normalized data, result data, status data, or alert data as determined from the originally scanned or measured data. As shown in FIG. 5, which illustrates a block connection diagram between the CMU 112, the server 130 and the client device 128, these devices can be connected via suitable communication means, including wired and / or wireless connection components that can implement one or more communication protocol standards such as, for example, TCP/IP, WiFi (802.11b), Bluetooth, Ethernet, or any other suitable communication protocol or standard. In some embodiments, the server 130 may be located at the same loading facility 104. In other embodiments, the server 130 may be located at a remote location, such as on a cloud platform or another remote location. In still other embodiments, the server 130 may be formed from a combination of local and cloud-based computers.
The server 130 is arranged to execute computer instructions to perform operations associated with the systems and methods described herein. The server 130 can implement enterprise service software that includes, for example, RESTful (representational state transfer) API services, message queuing services and event services that can be provided by various platforms or specifications, such as the J2EE specification implemented by the Oracle WebLogic Server platform, the JBoss platform or the IBM WebSphere platform, etc.
Other technologies or platforms, such as Ruby on Rails, Microsoft .NET, or similar, can also be used. To assist with the reporting of space usage, the above components can be used, alone or in combination, to detect and / or provide various measurements from the inside of the container docked at a loading compartment, and to use these measurements (i.e., data) to perform the necessary analyses.
Reference is now made to FIG. 6, which shows a flowchart representative of an exemplary method 200 for estimating the fullness of a container. At step 202, the method includes the operation of mounting an image capture device near a container loading area, the image capture device being operable to capture three-dimensional images. The image capture device can be implemented via the CMU 112, which is arranged to capture 3D images. It is preferable to orient the image capture device such that its 3D FOV extends over the area of the loading facility lot and, more specifically, over the area where a container (such as container 102) is expected to be positioned during loading and unloading procedures. This configuration allows the image capture device (by capturing and analyzing 3D data) to detect the presence or absence of various objects within its FOV and to make various determinations based thereon.
Next, at step 204, the method includes the operation of capturing, via the image capture device, a three-dimensional image representative of a three-dimensional formation, the three-dimensional image having a plurality of points with three-dimensional point data including depth data. In one embodiment, the 3D camera of the image capture device senses the depth of all the points within its FOV and assigns depth values thereto, thereby building up a point cloud that can be regarded as representative of the environment within its FOV.
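The construction of such a point cloud from a depth frame can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the pinhole-camera intrinsics (fx, fy, cx, cy) and the sample depth frame are assumed values for a hypothetical camera.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Convert a depth frame (metres) to an N x 3 array of (x, y, z) points
    using a pinhole-camera model. Pixels with no depth reading (0) are
    dropped, mirroring the 'absence of 3D data' case described in the text."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(float)
    valid = z > 0                      # unmeasurable points carry no data
    x = (u - cx) * z / fx              # horizontal position per point
    y = (v - cy) * z / fy              # vertical position per point
    return np.stack([x[valid], y[valid], z[valid]], axis=1)

# Hypothetical 4x4 depth frame: a rear wall 12 m away with a 2x2 box face
# at 9 m, plus one pixel that returned no IR reflection
frame = np.full((4, 4), 12.0)
frame[1:3, 1:3] = 9.0
frame[0, 0] = 0.0
cloud = depth_to_point_cloud(frame, fx=500.0, fy=500.0, cx=2.0, cy=2.0)
```

Each row of `cloud` is one point of three-dimensional point data; the z column is the depth data from which the histogram of step 206 is built.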
In addition to the depth measurements, the image capture device can assign horizontal and vertical position data to each of the points, thereby creating three-dimensional point data for each of the captured points. In some cases, the image capture device may not be able to sense anything in at least some portions of its FOV. This may occur, for example, if a portion of the environment is outside the depth-detection range of the image capture device or if, in the case that an IR-based 3D camera is used, there is a problem with measuring IR reflections. In such cases, the three-dimensional point data associated with points that cannot be measured can include data representative of an absence of an object or an absence of 3D data (which can be interpreted as an absence of a detectable object). It is noted here that by evaluating the 3D image, it may be possible to deduce the use of space within the container. In some implementations, this can be done by evaluating the depth values of at least some of the points included in the 3D image and correlating these values with fullness levels, as described further below. The actual computational analysis can take place on a server, such as the server 130 that receives the three-dimensional point data from the image capture device, on the image capture device itself, or on a combination thereof.
Upon receiving / obtaining the three-dimensional point data comprising the depth data, the method 200 includes, at step 206, the operation of generating a histogram of the depth data. When generating the depth histogram, it is preferable to select points within the three-dimensional image that represent either the rear wall of the container or the surfaces of goods present in the container. This is important because, while the FOV of the 3D camera remains static, the surfaces that generate 3D point data contributing to internal space usage are dynamic. To explain this further, reference is made to FIGs. 7A and 7B. FIG. 7A illustrates an image 300 of an interior of a container in which point 302 is representative of a point on the floor of the container, and thus its depth value has no particular effect on the internal use of space. On the other hand, a point 402 (located in the same direction as point 302) in the image 400 of FIG. 7B is occupied by a surface of a box 404, which does have an effect on the internal use of space. Therefore, in some embodiments, the points used for histogram generation are taken from surfaces substantially parallel to the container loading door near which the image capture device is located. Also, in some embodiments, the points used for histogram generation are taken from surfaces that are within 30 degrees of normal to a line extending from a point on the surface to the image capture device. By limiting the points for the histogram in these or other ways, the histogram can represent a distribution of depth values of points associated with articles that have an effect on the overall used and remaining capacities. The determination of a flat surface can itself be carried out via 3D image segmentation analysis. In some embodiments, sample consensus (SAC) segmentation analysis can be used to determine points in the 3D image data associated with different planes or surfaces. This can be applied to a wide variety of surfaces, including inside and outside surfaces of the container (e.g., inner walls, floor, ceiling and outer surfaces such as the outside of the door) and also surfaces of objects located in the container itself. SAC segmentation analysis determines or segments the different planes or surfaces of the environment into x, y, z coordinate planes by identifying a correlation of common points oriented along x, y, z planes within the 3D image data. As such, this method can be used to analyze points within the 3D image and identify the presence of planes corresponding to various surfaces.
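As a concrete illustration of sample-consensus plane segmentation, the sketch below fits a single dominant plane to a point cloud with a minimal RANSAC loop. This is not the patent's implementation (which may rely on an existing segmentation library); the tolerance, iteration count and synthetic scene are assumed values.

```python
import numpy as np

def ransac_plane(points, n_iters=200, tol=0.02, rng=None):
    """Minimal sample-consensus plane fit: repeatedly pick 3 points, form
    the plane through them, and keep the plane with the most inliers
    (points within `tol` of it). Returns (normal, d, inlier_mask) for the
    model n . p + d = 0. Illustrative only."""
    rng = np.random.default_rng(rng)
    best = (None, None, np.zeros(len(points), dtype=bool))
    for _ in range(n_iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(n)
        if norm < 1e-9:                  # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n.dot(p0)
        inliers = np.abs(points @ n + d) < tol
        if inliers.sum() > best[2].sum():
            best = (n, d, inliers)
    return best

# Hypothetical scene: a box face parallel to the loading door at z = 9 m,
# plus scattered points that belong to no plane
gen = np.random.default_rng(0)
face = np.column_stack([gen.uniform(0, 1, 80),
                        gen.uniform(0, 1, 80),
                        np.full(80, 9.0)])
noise = gen.uniform(0, 12, (20, 3))
normal, d, mask = ransac_plane(np.vstack([face, noise]), rng=1)
```

The recovered normal points along the camera axis for surfaces parallel to the loading door, which is exactly the orientation check the text uses to decide whether a plane's points should contribute to the histogram.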
Additionally, inspection of the orientation of the planes (and thus of the detected surfaces) can be performed to limit the points considered for the histogram. Relevant planes can, for example, be limited to those that have a maximum Δz for a given x1, x2 or a given y1, y2. Upon obtaining the correct histogram, further analyses can be performed on it to provide usage estimates. Referring to FIG. 8, a histogram is shown of depth values associated with those of FIG. 7B. From this data, one can estimate the fullness of the container by calculating an average depth value from the histogram and comparing the average value with a plurality of depth values that are each associated with a fullness level of the container. This operation can be performed, for example, by the server at step 208 of the method 200. Calculating the average depth value of the histogram of FIG. 8 yields a value which can then be used to determine the estimated fullness of the container. In one implementation, this is done by looking up which fullness amount corresponds to the average depth value in a look-up table. In another implementation, this calculation further depends on the overall dimensions of the container (at least one of a height, a width and a depth of the container), since an average depth value away from the image capture device will indicate an amount of fullness that varies with the overall area that can be filled. Although the above embodiments have been described in terms of estimating fullness by calculating average depth values from the histogram (also referred to as "histogram-average-based method(s)"), in other implementations fullness can be determined by histogram peak selection and evaluation (also referred to as "histogram-peak-based method(s)").
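The two estimation approaches can be sketched together as follows. This is an illustrative sketch under assumed parameters (bin width, a 50 % peak threshold, and a linear depth-to-fullness mapping standing in for the look-up table mentioned above), not the patent's actual calculation.

```python
import numpy as np

def depth_histogram(depths, container_depth, bin_width=0.1):
    """Histogram of the depth data over [0, container_depth] (step 206)."""
    counts, edges = np.histogram(depths,
                                 bins=round(container_depth / bin_width),
                                 range=(0.0, container_depth))
    centres = (edges[:-1] + edges[1:]) / 2.0
    return counts, centres

def average_based_fullness(depths, container_depth):
    """Histogram-average-based estimate: mean histogram depth, mapped so
    that mean == container_depth gives 0 % and mean == 0 gives 100 %."""
    counts, centres = depth_histogram(depths, container_depth)
    mean_depth = np.average(centres, weights=counts)
    return (1.0 - mean_depth / container_depth) * 100.0

def peak_based_fullness(depths, container_depth, threshold=0.5):
    """Histogram-peak-based estimate: among local peaks within `threshold`
    of the highest peak, take the one farthest from the CMU and subtract
    its share of the container depth from 100 %."""
    counts, centres = depth_histogram(depths, container_depth)
    peaks = [i for i in range(len(counts))
             if counts[i] > 0
             and (i == 0 or counts[i] >= counts[i - 1])
             and (i == len(counts) - 1 or counts[i] >= counts[i + 1])]
    tallest = max(counts[i] for i in peaks)
    farthest = centres[max(i for i in peaks if counts[i] >= threshold * tallest)]
    return (1.0 - farthest / container_depth) * 100.0

# Hypothetical mixed-load frame in a 12 m container: cargo faces at 3 m,
# with the rear wall at 12 m still visible above an open gaylord
depths = np.concatenate([np.full(400, 3.0), np.full(600, 12.0)])
avg_est = average_based_fullness(depths, 12.0)   # pulled up by the cargo
peak_est = peak_based_fullness(depths, 12.0)     # dominated by the rear wall
```

Comparing the two outputs over successive frames during loading is what enables the mixed-load indication: while the rear wall stays visible, the peak-based value remains near 0 % even as the average-based value climbs.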
In an exemplary histogram peak-based method, the fill determination includes identifying from the generated histogram a first plurality of histogram peaks that fall within a threshold percentage of a highest histogram peak, identifying the peak among the first plurality of histogram peaks that has the farthest distance value from the CMU, and using that farthest distance in the fill determination. In one implementation, the fill determination is performed by comparing the farthest distance value with a plurality of distance values that are each associated with a fill level of the container. In another implementation, the farthest distance is converted to a percentage of the total container depth and then subtracted from 100% to provide the percentage of the container that is estimated to be filled.

Since the histogram average-based method and the histogram peak-based method are based on different aspects of the histogram, they will often yield different fill levels. With respect to the graph 500 of FIG. 9, for example, unlike the container fill 502 estimated via the histogram average-based method, the container fill 504 estimated via the histogram peak-based method remains at or near 0% despite boxes being loaded into the container. This occurs because the rear wall of the container (above all loaded boxes) remains within the view of the camera. Due to the size of this wall, it registers as a relatively large peak on the histogram and, assuming a sufficient threshold, will be selected for the fill determination. Although estimates determined by the histogram peak-based method(s) may not seem accurate, their benefit becomes more apparent when they are combined with estimates determined by the histogram average-based method(s). Using both methods can, for example, help determine whether mixed loading (loading of gaylord and/or gaylord-like boxes and/or cargo) has occurred. When mixed loading occurs, as illustrated in FIG.
9, the container-fill metric obtained via the histogram peak-based method is likely to remain at a relatively low value while the container-fill metric obtained via the histogram average-based method is likely to increase. On the other hand, as shown in the graph 600 of FIG. 10, when mixed loading does not occur, the container-fill metric 602 obtained via the histogram peak-based method and the container-fill metric 604 obtained via the histogram average-based method repeatedly converge and diverge during the loading process. In one embodiment, convergence can be regarded as the two metrics being within 10% of the overall load amount of each other, and divergence as being more than 10% of the overall load amount apart. In graph 600 it can be seen that convergence occurs in, for example, area 606 and that divergence occurs in, for example, area 608. Based on this, a determination of mixed loading can be made when the container fill estimated by the histogram peak-based method and the container fill estimated by the histogram average-based method diverge and do not re-converge, and a determination of no mixed loading can be made when the container fill estimated by the histogram peak-based method and the container fill estimated by the histogram average-based method diverge and then re-converge.

Specific embodiments have been described in the foregoing description. Various modifications and changes can be made without departing from the scope of the invention as set forth in the claims below. Accordingly, the specification and figures are to be regarded in an illustrative rather than a restrictive manner, and all such modifications are intended to be included within the scope of the present teachings. In addition, the described embodiments/examples/implementations should not be interpreted as mutually exclusive, but may instead be combined where such combinations are possible in any way.
In other words, any feature described in one of the above embodiments/examples/implementations can be included in another embodiment/example/implementation. In addition, none of the steps of any method described herein has a specific sequence, unless it is expressly stated that no other sequence is possible or necessary for the remaining steps of the method. The benefits, solutions to problems, and any element that can cause any benefit or solution to occur or become more pronounced should not be interpreted as critical, required, or essential features or elements of any or all of the claims. The invention is defined solely by the appended claims, including any adjustments made during the grant phase of this application, and all equivalents of those claims as described.

For the purpose of clarity and brevity of description, features are described herein as part of the same or separate embodiments. It is noted, however, that the scope of the invention may include embodiments that include combinations of all or some of the features described. It is further noted that the embodiments shown have the same or similar components, unless they are described as being different.

In addition, in this document, relative terms such as first and second, top and bottom, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "includes," "including," "has," "having," "contains," "containing," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or device that comprises, has, or contains a list of elements may include other elements that are not expressly listed or inherent in such a process, method, article, or device. An element preceded by "includes ... a", "has ... a", "contains ...
a" does not, without further restriction, exclude the existence of additional identical elements in the process, method, article, or device that comprises the element. The term "a" or "an" is defined as one or more, unless explicitly stated otherwise herein. The terms "substantially", "essentially", "approximately", or any other version thereof are defined as being close to, as understood by the skilled person, and in one non-limiting embodiment the term is defined as being within 10%, in another embodiment as being within 5%, in another embodiment as being within 1%, and in another embodiment as being within 0.5%. The term "coupled" as used herein is defined as connected, although not necessarily directly and not necessarily mechanically. A device or structure that is "configured" in a certain way is configured in at least that way, but can also be configured in ways that are not specified.

Some embodiments may comprise one or more generic or specialized processors (or "processing devices"), such as microprocessors, digital signal processors, customized processors, and field-programmable gate arrays (FPGAs), and unique stored program instructions (including both software and firmware) that control the one or more processors, in combination with certain non-processor circuits, to implement some, most, or all functions of the method and/or device described herein. Alternatively, some or all of the functions could be implemented by a state machine that has no stored program instructions, or in one or more application-specific integrated circuits (ASICs), in which each function or some combinations of certain functions are implemented as custom logic. A combination of the two approaches could of course be used. In addition, an embodiment may be implemented as a computer-readable storage medium having computer-readable code stored thereon for programming a computer (e.g., comprising a processor) to perform a method as described and claimed herein.
Examples of such computer-readable storage media include, but are not limited to, a hard disk, a CD-ROM, an optical storage device, a magnetic storage device, a ROM (read-only memory), a PROM (programmable read-only memory), an EPROM (erasable programmable read-only memory), an EEPROM (electrically erasable programmable read-only memory), and a flash memory. The skilled person, despite potentially significant effort and many design choices motivated by, for example, available time, current technology, and economic considerations, will, when guided by the concepts and principles described herein, readily be able to generate such software instructions and programs and ICs with minimal experimentation.

The abstract is provided to enable the reader to quickly ascertain the nature of the technical description. It is submitted with the understanding that it will not be used to interpret the claims or to limit their scope. In addition, in the foregoing description of the figures, various features may be grouped together in various embodiments for the purpose of streamlining the description. This manner of description should not be interpreted as reflecting an intention that the claimed embodiments require more features than are explicitly recited in each claim. Rather, as the following claims show, the inventive subject matter lies in less than all of the features of a single described embodiment. Thus, the following claims are hereby incorporated into the description of the figures, with each claim standing on its own as separately claimed subject matter. The mere fact that certain features are recited in mutually different claims does not indicate that a combination of those features cannot be used to advantage. A large number of variants will be clear to the skilled person. All variants are considered to be included within the scope of the invention as defined in the following claims.
Claims:
[1] A method for estimating a fill level of a container, the method comprising: attaching an image pickup device near a container loading area, wherein the image pickup device is operable to record three-dimensional images; recording, via the image pickup device, a three-dimensional image representative of a three-dimensional formation, the three-dimensional image having a plurality of points with three-dimensional point data including depth data; generating a histogram of the depth data from the three-dimensional image; and estimating the fill level of the container based at least in part on the histogram.

[2] The method according to claim 1, wherein the container is bounded by a floor, a ceiling, a first standing wall, a second standing wall, and a third standing wall, wherein the first standing wall is opposite and parallel to the second standing wall, and the third standing wall is perpendicular to the first standing wall and the second standing wall.

[3] The method according to claim 2, wherein the container is further bounded by a door that is opposite and parallel to the third standing wall.

[4] The method according to any one of the preceding claims, wherein the three-dimensional image is recorded with a field of view that does not extend beyond the inside of the container.

[5] The method according to any one of the preceding claims, wherein estimating the fill level of the container includes calculating an average depth value from the histogram and comparing the average value with a plurality of depth values each associated with a fill level of the container.

[6] The method according to claim 5, wherein each of the fill levels of the container depends on at least one of the following: a height, a width, and a depth of the container.
[7] A system for analyzing capacity of a container, comprising: a container monitoring unit (CMU) mounted near a loading bay, the CMU comprising: a housing; an imaging assembly at least partially within the housing and operable to record a three-dimensional image representative of a three-dimensional formation, the three-dimensional image having a plurality of points with three-dimensional point data including depth data; and a CMU controller communicatively connected to the imaging assembly, the controller being operable to transmit the three-dimensional point data; and a host computer communicatively connected to the CMU, wherein the host computer comprises a host computer controller operable to: receive the three-dimensional point data from the CMU controller; generate a histogram of the depth data from at least a portion of the three-dimensional point data; and estimate a first fill level of the container based at least in part on the histogram.

[8] The system according to claim 7, wherein the container is bounded by a floor, a ceiling, a first standing wall, a second standing wall, and a third standing wall, wherein the first standing wall is opposite and parallel to the second standing wall, and the third standing wall is perpendicular to the first standing wall and the second standing wall.

[9] The system according to claim 8, wherein the container is further bounded by a door that is opposite and parallel to the third standing wall.

[10] The system according to claim 8 or 9, wherein the three-dimensional image is recorded with a field of view that does not extend beyond the inside of the container.
[11] The system according to any one of the preceding claims 8-10, wherein the host computer controller is operable to estimate the first fill level of the container by calculating an average depth value from the histogram and comparing the average value with a plurality of depth values each associated with a first fill level of the container.

[12] The system according to claim 11, wherein each first fill level of the container depends on at least one of the following: a height, a width, and a depth of the container.

[13] The system according to any one of the preceding claims 8-12, wherein the host computer controller is operable to: estimate the first fill level of the container by calculating an average depth value from the histogram and basing the first fill level at least in part on the average depth value; estimate a second fill level of the container by identifying a first plurality of histogram peaks that fall within a threshold percentage of a highest histogram peak, identifying a peak from the first plurality of histogram peaks that has a farthest distance value from the CMU, and basing the second fill level of the container at least in part on the farthest distance value; provide an indication of mixed loading when, during loading of the container, the first fill level of the container and the second fill level of the container diverge and do not re-converge over at least a portion of the loading; and provide an indication of no mixed loading when, during loading of the container, the first fill level of the container and the second fill level of the container diverge and re-converge over the at least one portion of the loading.

[14] The system according to claim 13, wherein basing the first fill level at least in part on the average depth value comprises comparing the average depth value with a first plurality of depth values each associated with a first fill level of the container.
[15] The system according to claim 14, wherein basing the second fill level of the container at least in part on the farthest distance value comprises comparing the farthest distance value with a second plurality of distance values each associated with a fill level of the container.

[16] The system according to claim 14 or 15, wherein basing the second fill level of the container at least in part on the farthest distance value comprises converting the farthest distance value to a percentage of the container depth and subtracting that percentage from 100%.

[17] A container monitoring unit (CMU) for analyzing capacity of a container, comprising: a housing; an imaging assembly at least partially within the housing and operable to record a three-dimensional image representative of a three-dimensional formation, wherein the three-dimensional image has a plurality of points with three-dimensional point data comprising depth data; and a CMU controller communicatively connected to the imaging assembly, the controller being operable to: generate a histogram of the depth data from at least a portion of the three-dimensional point data; and estimate a fill level of the container, based at least in part on the histogram, via an average-based histogram calculation.

[18] The CMU according to claim 17, wherein the average-based histogram calculation comprises calculating an average depth value from the histogram and comparing the average value with a plurality of depth values each associated with a fill level of the container.
[19] The CMU according to claim 17 or 18, wherein the controller is further operable to: estimate a fill level of the container, based at least in part on the histogram, via a peak-based histogram calculation; provide an indication of mixed loading when, during loading of the container, the container fill level estimated via the average-based histogram calculation and the container fill level estimated via the peak-based histogram calculation diverge and do not re-converge over at least a portion of the loading; and provide an indication of no mixed loading when, during loading of the container, the container fill level estimated via the average-based histogram calculation and the container fill level estimated via the peak-based histogram calculation diverge and re-converge over at least a portion of the loading.

[20] The CMU according to claim 19, wherein the average-based histogram calculation comprises calculating an average depth value from the histogram and basing the container fill level estimated via the average-based histogram calculation at least in part on the average depth value from the histogram, and wherein the peak-based histogram calculation comprises identifying a first plurality of histogram peaks that fall within a threshold percentage of a highest histogram peak, identifying a peak from the first plurality of histogram peaks that has a farthest distance value from the CMU, and basing the container fill level estimated via the peak-based histogram calculation at least in part on the farthest distance value.
Patent family:
Publication | Publication date
PL434590A1 | 2021-08-16
DE112018006533T5 | 2020-09-03
CN111512314A | 2020-08-07
WO2019125629A1 | 2019-06-27
BE1025929A1 | 2019-08-09
US10692236B2 | 2020-06-23
US20190197719A1 | 2019-06-27
Legal status:
2020-02-05 | FG | Patent granted | Effective date: 2020-01-06
Priority:
Application | Filing date | Title
US 15/853,223 (US10692236B2) | 2017-12-22 | Container use estimation